    Conditional reconstruction: An alternative strategy in digital rock physics

    Digital rock physics (DRP) is a recently developed method based on imaging and digitizing the 3D pore and mineral structure of actual rock and numerically computing rock physical properties such as permeability, elastic moduli, and formation factor. Modern high-resolution microcomputed tomography scanners are used for imaging, but these devices are not widely available, and 3D imaging is also costly and time-consuming. However, recent improvements in 3D reconstruction algorithms, such as cross-correlation-based simulation, together with the concept of rock physical trends, have opened new avenues in DRP. We have developed a modified workflow using higher-order statistical methods. First, a high-resolution 2D image is divided into smaller subimages. Then, different stochastic subsamples are generated from the 2D subimages. Finally, various rock physical parameters are calculated. Using several subsamples allows rock physical trends to be extracted and better captures heterogeneity and variability. We applied our workflow to two DRP benchmark data sets (Berea sandstone and Grosmont carbonate) and a thin-section image from the Grosmont carbonate formation. Results from realization models, pore-network modeling, and autocorrelation functions for the real and reconstructed subsamples confirm the validity of the reconstructed models. Furthermore, the agreement between static and dynamic methods indicates that the subsamples are representative volume elements. Average values of the subsamples' properties follow the reference trends of the rock sample. Permeability trends pass through the actual results of the benchmark samples; elastic moduli trends, however, yield higher values. The latter can be attributed to the image resolution and voxel size produced by the imaging tools and reconstruction algorithms. Based on these results, this strategy can be introduced as a valid and accurate method where an alternative to standard DRP is needed.
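
    A rough illustration of the subdivision-and-trend idea: the sketch below splits a 2D binary image into subimages, builds a stochastic 3D volume from each, and averages a property over the realizations. All names are hypothetical, and reconstruct_3d is a naive stand-in for the cross-correlation-based reconstruction, which is not reproduced here.

```python
import numpy as np

def split_into_subimages(image: np.ndarray, tile: int) -> list:
    """Divide a 2D binary pore/grain image into non-overlapping square subimages."""
    h, w = image.shape
    return [image[i:i + tile, j:j + tile]
            for i in range(0, h - tile + 1, tile)
            for j in range(0, w - tile + 1, tile)]

def reconstruct_3d(subimage: np.ndarray, depth: int, seed: int) -> np.ndarray:
    """Naive stand-in for a CCSIM-like stochastic reconstruction: stack noisy
    copies of the 2D slice. A real reconstruction would honor the higher-order
    spatial statistics of the training image."""
    rng = np.random.default_rng(seed)
    noise = rng.random((depth, *subimage.shape)) < 0.02
    return np.logical_xor(np.broadcast_to(subimage, (depth, *subimage.shape)), noise)

def porosity(volume: np.ndarray) -> float:
    """Fraction of pore voxels (True/1 marks pore space)."""
    return float(volume.mean())

# Trend extraction: one property value per stochastic subsample, then an
# average over realizations, mirroring the use of many subsamples above.
image = np.random.default_rng(0).random((512, 512)) < 0.2  # toy binary image
volumes = [reconstruct_3d(s, depth=64, seed=k)
           for k, s in enumerate(split_into_subimages(image, tile=128))]
phis = [porosity(v) for v in volumes]
print(f"mean porosity over {len(phis)} subsamples: {np.mean(phis):.3f}")
```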

    Explainable machine learning for labquake prediction using catalog-driven features

    Recently, machine learning (ML) has been widely utilized for laboratory earthquake (labquake) prediction using various types of data. This study pioneers time-to-failure (TTF) prediction based on ML using acoustic emission (AE) records from three laboratory stick-slip experiments performed on Westerly granite samples with naturally fractured rough faults, which are more similar to the heterogeneous fault structures found in nature. Forty-seven catalog-driven seismo-mechanical and statistical features are extracted, including several new features based on focal mechanisms. A regression voting ensemble of Long Short-Term Memory (LSTM) networks predicts TTF with a coefficient of determination (R²) of 70% on the test dataset. Feature importance analysis revealed that the AE rate, correlation integral, event proximity, and focal mechanism-based features are the most important for TTF prediction. The results reveal that the network uses all the information carried by the features, including general trends in highly correlated features as well as fine details about local variations and fault evolution contained in weakly correlated features. Therefore, some highly correlated and physically meaningful features may appear less important for TTF prediction because of their correlation with other important features. Our study provides a basis for applying catalog-driven features to constrain the TTF of complex heterogeneous rough faults, an approach that can be developed for real applications.
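
    A minimal sketch of a regression voting ensemble of LSTM networks in the spirit described above: several independently trained members whose predictions are averaged. The window length, layer sizes, and toy data are assumptions, not the study's configuration.

```python
import numpy as np
import tensorflow as tf

# Hypothetical shapes: 47 catalog-driven features over a sliding window
# of 32 AE events; the regression target is time to failure (TTF).
N_FEATURES, WINDOW, N_MODELS = 47, 32, 5

def build_lstm() -> tf.keras.Model:
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW, N_FEATURES)),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dense(1),  # regression head: predicted TTF
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Toy data standing in for the labquake catalogs.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, WINDOW, N_FEATURES)).astype("float32")
y = rng.uniform(0, 10, size=(200, 1)).astype("float32")

# Voting ensemble for regression: train independent members and average
# their outputs, which reduces the variance of the TTF estimate.
members = [build_lstm() for _ in range(N_MODELS)]
for m in members:
    m.fit(X, y, epochs=2, batch_size=32, verbose=0)
y_hat = np.mean([m.predict(X, verbose=0) for m in members], axis=0)
```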

    A Hierarchical Sampling for Capturing Permeability Trend in Rock Physics

    Among the properties of reservoir rocks, permeability is an important parameter that is typically measured from core samples in the laboratory. Because core drilling cannot cover an entire reservoir, simulation of the rock porous medium is needed to explore scenarios not seen in the available data. One of the most accurate methods is cross-correlation-based simulation (CCSIM), which has recently been applied widely in geoscience and porous media. The purpose of this study is to produce realizations with the same permeability trend as a real sample. A Berea sandstone sample is selected for this purpose. Permeability results extracted from smaller subsamples of the original sample showed that the classic Kozeny–Carman permeability trend is not suitable for this sample. One reason may be that this equation does not account for the geometrical and fractal properties of the pore-space distribution. Thus, a general trend based on the fractal dimensions of the pore space and the tortuosity of the Berea sample is applied in this paper. Results show that direct 3D stochastic modeling of porous media preserves the porous structure and fractal behavior of the rock. On the other hand, using only 2D images to construct the 3D pore structures does not reproduce the measured experimental permeability. To this end, hierarchical sampling is implemented in two and three steps using both 2D and 3D stochastic modeling. The results showed that two-step sampling is not adequate, whereas the three-step sampling shows excellent performance, by which different models of porous media with the same permeability trend as the Berea sandstone sample can be generated.
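
    For context, one common textbook form of the Kozeny–Carman relation referred to above is given below; the notation is assumed here, and the paper's fractal-dimension-based generalization is not reproduced.

```latex
% Kozeny-Carman relation (one common form; notation assumed):
%   k    : permeability            \phi : porosity
%   \tau : hydraulic tortuosity    S    : specific surface area
%   c    : Kozeny shape constant
\[
  k \;=\; \frac{\phi^{3}}{c\,\tau^{2}\,S^{2}\,(1-\phi)^{2}}
\]
```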

    Single-Station Coda Wave Interferometry: A Feasibility Study Using Machine Learning

    Coda wave interferometry is usually applied with pairs of stations, analyzing the signal transmitted from one station to another. A feasibility study was performed to evaluate whether a single station could be used instead. In this case, the coda wave signal reflected from a zone to be identified was analyzed. Finite-difference simulations of wave propagation were used to study whether ultrasonic measurements could detect velocity changes in such a zone up to a depth of 1.6 m in a highly scattering medium. For this purpose, 1D convolutional neural networks were used for prediction. The crack density, the crack length, and the intrinsic attenuation were varied in the considered background material. The influence of noise and of the sensor width was investigated as well. It was shown that, in general, the suggested single-station approach is a viable way to identify damage zones, and the method was robust against the studied variations. The suggested workflow takes advantage of machine-learning techniques and can be transferred to the detection of defects in concrete structures.
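
    A minimal sketch of a 1D convolutional regression network of the kind described, mapping a single-station coda waveform to a velocity-change estimate. The input length, layer sizes, and target definition are assumptions, not the study's architecture.

```python
import tensorflow as tf

# Hypothetical setup: each input is a recorded coda waveform of 4096
# samples; the target is the relative velocity change dv/v in the zone.
N_SAMPLES = 4096

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_SAMPLES, 1)),
    tf.keras.layers.Conv1D(16, kernel_size=15, activation="relu"),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.Conv1D(32, kernel_size=9, activation="relu"),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),  # regression output: predicted dv/v
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```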

    High performance of the support vector machine in classifying hyperspectral data using a limited dataset

    To prospect for mineral deposits at the regional scale, recognition and classification of hydrothermal alteration zones using remote sensing data is a popular strategy. Because of the large number of spectral bands, classification of hyperspectral data may be negatively affected by the Hughes phenomenon. A practical way to handle the Hughes problem is to prepare enough training samples that the size of the training set becomes comparable with the number of spectral bands. Gathering adequate ground-truth instances as training samples, however, requires a time-consuming and costly ground survey. When preparing enough field samples is not easy, a classifier that can work properly with a limited training dataset is highly desirable. Among supervised classification methods, the Support Vector Machine (SVM) is known as a promising classifier that can produce acceptable results even with limited training data. Here, this capability is evaluated by using the SVM to classify the alteration zones of the Darrehzar district. For this purpose, only 12 sampled instances from the study area are used to classify Hyperion hyperspectral data with 165 usable spectral bands. The results demonstrate that if the parameters of the SVM, namely C and σ, are accurately tuned, the SVM can successfully identify alteration zones when sufficient field samples are not available.
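
    Tuning an RBF-kernel SVM's C and σ on a very small training set is a standard workflow; a minimal scikit-learn sketch follows. Note that scikit-learn parameterizes the RBF kernel with gamma, related to the σ above by gamma = 1/(2σ²); the toy data merely stand in for the 12 Darrehzar samples.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Toy stand-in for the Hyperion data: 12 labeled pixels, 165 bands,
# four hypothetical alteration classes with three samples each.
rng = np.random.default_rng(0)
X = rng.normal(size=(12, 165))
y = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 3])

grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [1, 10, 100, 1000], "gamma": [1e-4, 1e-3, 1e-2]},
    cv=3,  # 3-fold CV keeps one sample per class in every fold
)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```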

    Stochastic modeling of coal fracture network by direct use of micro-computed tomography images

    Characterization of coalbed methane reservoirs is a challenging task because of the complex petrophysical properties of coal. The coal cleat system plays a key role in gas permeability through the coalbed. Previous computational methods for characterization and modeling of coal formations do not account for the actual complexity of cleat systems, as they commonly rely on simple statistical properties to describe the fractures. In this study, unlike previous methods that extract only some of the spatial statistical properties, the 2D/3D micro-computed tomography images are used directly, without simplifications or assumptions. The generated models are compared with discrete fracture networks (DFNs), one of the most widely used methods for modeling such complex coal cleat systems. Results show that the utilized algorithm produces visually satisfactory realizations of both the coal matrix and the cleat system. To quantify these similarities, autocorrelation functions, connectivity (with two distinct indices), and average fracture length and orientation are computed. Moreover, the permeabilities and porosities of the reconstructed samples are calculated and compared with those of the original sample. It is demonstrated that the proposed reconstruction method reproduces samples with similar statistical and petrophysical properties but with different patterns of both the coal porous region and the fracture system. Finally, the proposed method and the DFN realizations are compared extensively. The results of this study can be used to characterize coal samples with any degree of complexity and heterogeneity by producing several realistic stochastic models. Consequently, petrophysical properties and their corresponding uncertainties can be evaluated more accurately.
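
    The autocorrelation comparison mentioned above can be computed efficiently via the Wiener–Khinchin theorem; a minimal sketch for a 2D binary image follows (it assumes periodic boundaries, as the FFT implies).

```python
import numpy as np

def autocorrelation_2d(image: np.ndarray) -> np.ndarray:
    """Normalized two-point autocorrelation of a 2D binary image via FFT,
    of the kind used to compare real and reconstructed samples."""
    ind = image.astype(float)
    ind -= ind.mean()                      # work with the fluctuation field
    f = np.fft.fft2(ind)
    acf = np.fft.ifft2(f * np.conj(f)).real
    acf /= acf.flat[0]                     # normalize so ACF(0) = 1
    return np.fft.fftshift(acf)            # put zero lag at the center

# Toy binary "cleat" image standing in for a micro-CT slice.
img = np.random.default_rng(1).random((256, 256)) < 0.1
acf = autocorrelation_2d(img)
```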

    Using a Feature Subset Selection method and Support Vector Machine to address curse of dimensionality and redundancy in Hyperion hyperspectral data classification

    The curse of dimensionality resulting from insufficient training samples and redundancy is an important problem in the supervised classification of hyperspectral data. This problem can be handled by Feature Subset Selection (FSS) methods and the Support Vector Machine (SVM). FSS methods manage redundancy by removing redundant spectral bands, and kernel-based methods, especially the SVM, are highly capable of classifying limited-sample datasets. This paper mainly aims to assess the capability of an FSS method and the SVM under curse-of-dimensionality conditions and to compare the results with an Artificial Neural Network (ANN) when classifying alteration zones in a Hyperion hyperspectral image acquired over the largest Iranian porphyry copper complex. The results demonstrated that as the training samples were decreased, the accuracy of the SVM dropped by only 1.8%, whereas the accuracy of the ANN was greatly reduced, i.e., by 14.01%. In addition, a hybrid FSS was applied to reduce the dimensionality of the Hyperion data. Accordingly, of the 165 usable spectral bands of Hyperion, only 18 bands were selected as the most important and informative. Although this dimensionality reduction did not substantially improve the performance of the SVM, the ANN showed a significant improvement in computational time and a slight enhancement in average accuracy. Therefore, the SVM, as a method with low sensitivity to the size of the training dataset and the feature space, can be applied to curse-of-dimensionality problems. FSS methods can also improve the performance of non-kernel-based classifiers by eliminating redundant features. Keywords: Curse of dimensionality, Feature Subset Selection, Hydrothermal alteration, Hyperspectral, SVM
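
    A minimal sketch of the band-selection-then-SVM pipeline follows. Mutual-information ranking is only a stand-in for the paper's hybrid FSS, which is not specified in the abstract; k=18 matches the number of selected bands reported above.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Toy stand-in: 100 labeled pixels, 165 Hyperion bands, 4 classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 165))
y = rng.integers(0, 4, size=100)

# Keep 18 bands, then classify with an RBF-kernel SVM.
clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=18),
    SVC(kernel="rbf", C=100, gamma=1e-3),
)
clf.fit(X, y)
print(clf.score(X, y))
```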